4 research outputs found

    Predicting Academic Performance: A Systematic Literature Review

    The ability to predict student performance in a course or program creates opportunities to improve educational outcomes. With effective performance-prediction approaches, instructors can allocate resources and instruction more accurately. Research in this area seeks to identify features that can be used to make predictions, to identify algorithms that can improve predictions, and to quantify aspects of student performance. Moreover, research in predicting student performance seeks to determine interrelated features and to identify the underlying reasons why certain features work better than others. This working group report presents a systematic literature review of work in the area of predicting student performance. Our analysis shows a clearly increasing amount of research in this area, as well as an increasing variety of techniques used. At the same time, the review uncovered a number of issues with research quality that drive a need for the community to provide more detailed reporting of methods and results and to increase efforts to validate and replicate work.
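    To make the kind of prediction research surveyed here concrete, the sketch below trains a classifier to predict pass/fail outcomes from course-activity features. Everything in it is an illustrative assumption (the feature set, the toy data, and the choice of logistic regression); the review itself does not prescribe any particular features or algorithm.

```python
# Minimal sketch of a performance-prediction pipeline (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical per-student features: [assignment_avg, forum_posts, logins]
X = np.array([
    [0.91, 14, 52],
    [0.55,  2, 11],
    [0.78,  9, 30],
    [0.40,  1,  8],
    [0.85, 11, 45],
    [0.60,  3, 15],
])
# 1 = passed the course, 0 = did not
y = np.array([1, 0, 1, 0, 1, 0])

model = LogisticRegression(max_iter=1000)
# Cross-validated accuracy quantifies how well the features predict outcomes,
# the kind of measure the reviewed studies report.
scores = cross_val_score(model, X, y, cv=3)
print(f"Mean CV accuracy: {scores.mean():.2f}")
```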

    An Online Peer-Assessment Methodology for Improved Student Engagement and Early Intervention

    Student performance is commonly measured using summative assessment methods such as midterms, final exams, and high-stakes testing. Although less common, there are other methods of gauging student performance. Formative assessment is a continuous, student-oriented form of assessment that focuses on helping students improve their performance through continuous engagement and constant measurement of progress. One assessment practice that has been used in this way for decades is peer-assessment, which relies on having students evaluate the work of their peers. Peer-assessment is applied at various levels of education; the research discussed here was conducted in a higher-education setting. Despite its cross-domain adoption and longevity, peer-assessment has been difficult to use in courses with a high number of students. This stems directly from the fact that it has traditionally been carried out using pen and paper. In courses with hundreds of students, such manual forms of peer-assessment would require a significant amount of time to complete and would add substantially to both student and instructor workload. Automated peer-assessment, on the other hand, has the advantage of reducing, if not eliminating, many of the issues affecting the efficiency and effectiveness of the practice. Moreover, its potential to scale up easily makes it a promising platform for conducting large-scale experiments or replicating existing ones. The goal of this thesis is to examine how the potential of automated peer-assessment may be exploited to improve student engagement and to demonstrate how a well-designed peer-assessment methodology may help teachers identify at-risk students in a timely manner. A methodology is developed to demonstrate how online peer-assessment may elicit continuous student engagement. Data collected from a web-based implementation of this methodology are then used to construct several models that predict student performance and monitor progress, highlighting the role of peer-assessment as a tool for early intervention. The construction of open datasets from online peer-assessment data gathered from five undergraduate computer science courses is discussed. Finally, a promising role of online peer-assessment in measuring levels of student proficiency and test-item difficulty is demonstrated by applying a generic Item Response Theory model to the peer-assessment data.
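    The Item Response Theory application mentioned at the end of the abstract can be sketched with a Rasch (1PL) model, where the probability that student s succeeds on item i is sigmoid(theta_s - b_i). The response matrix and the plain gradient-ascent fit below are illustrative assumptions, not the thesis's actual data or estimation procedure.

```python
# Minimal Rasch (1PL) fit by gradient ascent on the log-likelihood.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical peer-assessment outcomes: rows = students, cols = items,
# 1 = judged correct by peers, 0 = judged incorrect.
R = np.array([
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 0],
    [1, 1, 1, 1],
])

theta = np.zeros(R.shape[0])  # student proficiency estimates
b = np.zeros(R.shape[1])      # item difficulty estimates
lr = 0.1
for _ in range(300):
    P = sigmoid(theta[:, None] - b[None, :])  # predicted success probabilities
    resid = R - P                             # observed minus predicted
    theta += lr * resid.sum(axis=1)           # dL/dtheta_s = sum_i (r - P)
    b -= lr * resid.sum(axis=0)               # dL/db_i = -sum_s (r - P)
    b -= b.mean()                             # anchor scale: mean difficulty = 0

print("proficiency:", np.round(theta, 2))
print("difficulty: ", np.round(b, 2))
```

    With real data, the fitted theta values rank students by proficiency and the b values rank items by difficulty, which is the measurement role the abstract attributes to online peer-assessment.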

    Controlled outputs, full data: A privacy-protecting infrastructure for MOOC data

    Learning analytics research presents challenges for researchers embracing the principles of open science. Protecting student privacy is paramount, but progress in increasing scientific understanding and improving educational outcomes depends upon open, scalable, and replicable research. Findings have repeatedly been shown to be contextually dependent on personal and demographic variables, so how can we use these data in a manner that is ethical and secure for all involved? This paper presents ongoing work on the MOOC Replication Framework (MORF), a big data repository and analysis environment for Massive Open Online Courses (MOOCs). We discuss MORF's approach to protecting student privacy, which allows researchers to use data without having direct access to it. Through an open API, documentation, and tightly controlled outputs, this framework provides researchers with the opportunity to perform secure, scalable research and facilitates collaboration, replication, and novel research. We also highlight ways in which MORF represents a solution template for issues surrounding privacy and security in the age of big data in education, as well as key challenges still to be tackled.

    Practitioner notes

    What is already known about this topic:
    - Personal Identifying Information (PII) has many valid and important research uses in education.
    - The ability to replicate or build on analyses is important to modern educational research and is usually enabled through sharing data.
    - Data sharing generally does not involve PII, in order to protect student privacy.
    - MOOCs present a rich data source for education researchers seeking to better understand online learning.

    What this paper adds:
    - The MOOC Replication Framework (MORF) 2.1 is a new infrastructure that enables researchers to conduct analyses on student data without having direct access to the data, thus protecting student privacy.
    - Detail of the MORF 2.1 structure and workflow.

    Implications for practice and/or policy:
    - MORF 2.1 is available for use by practitioners and researchers, with policy implications.
    - The infrastructure and approach in MORF could be applied to other types of educational data.
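    The "controlled outputs, full data" pattern the paper describes can be illustrated with a small sketch: researcher code executes where the raw records live, and only aggregate results that pass a disclosure check are released. All names here (run_in_enclave, MIN_CELL_SIZE, the record layout) are hypothetical stand-ins, not MORF's actual API or output policy.

```python
# Sketch of analysis-to-data with controlled outputs (illustrative only).
import statistics

MIN_CELL_SIZE = 5  # suppress aggregates computed over too few students

def run_in_enclave(records, analysis):
    """Run the researcher's analysis against raw records, but release
    only outputs that pass the disclosure check."""
    output = analysis(records)  # full data access happens only here
    released = {}
    for key, (value, n) in output.items():
        # Controlled output: drop any statistic based on < MIN_CELL_SIZE rows
        if n >= MIN_CELL_SIZE:
            released[key] = value
    return released

def completion_rate_by_group(records):
    """Example researcher analysis: completion rate per demographic group,
    returned as (value, row count) so the gatekeeper can vet each cell."""
    groups = {}
    for r in records:
        groups.setdefault(r["group"], []).append(r["completed"])
    return {g: (statistics.mean(v), len(v)) for g, v in groups.items()}

# Hypothetical MOOC records; row-level data never leaves run_in_enclave.
data = [{"group": "A", "completed": 1.0}] * 6 + [{"group": "B", "completed": 0.0}] * 2
print(run_in_enclave(data, completion_rate_by_group))
# -> {'A': 1.0}  (group B is suppressed: only 2 students)
```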